CONTROL METHOD BASED ON TOUCH AND GESTURE INPUT AND TERMINAL THEREFOR
Patent abstract:
A control method based on touch and gesture input, and a terminal therefor, are provided for a mobile terminal or portable display, to facilitate switching between objects in response to a gesture input made subsequent to a touch input. The method includes detecting a touch input; selecting at least one object corresponding to the touch input; detecting a gesture input; and executing a switch corresponding to the gesture input in a state in which the object is held at the position of the touch input. The invention allows paging through lists of documents, icons, and the like, while the held object remains displayed on the touch screen.

Publication number: BR102013016792B1
Application number: R102013016792-4
Filing date: 2013-06-28
Publication date: 2021-02-23
Inventors: Jinyong KIM; Jinyoung Jeon; Jiyoung KANG; Daesung Kim; Boyoung Lee
Applicant: Samsung Electronics Co., Ltd.
Primary IPC classification:
Patent description:
BACKGROUND OF THE INVENTION

Field of the Invention

The present invention relates to a control method based on touch and gesture input and to a portable terminal configured to carry out the control method. More particularly, the present invention relates to a control method based on touch and gesture input for performing a switching operation using a gesture input made subsequent to a touch input.

Description of the Related Art

With the advancement of communication technology and interactive display technology, smart electronic devices such as smartphones and handheld terminals such as tablets employ various input means, such as a touch screen, so that the user can control the device more conveniently. Accordingly, studies are being conducted on recognizing touch, movement and gesture inputs with the assistance of sensors, which can reduce the need to type commands on the relatively small display screen and allow commonly requested commands to be performed quickly by gesture inputs. Technological advances have made it possible for portable terminals to recognize various types of inputs, while user demand for simplified terminal handling keeps growing. However, despite their ability to detect various types of inputs, current conventional terminals make limited use of this capability to control terminal operations, and thus fail to meet users' needs.

SUMMARY OF THE INVENTION

The present invention was made, in part, in an effort to resolve some of the drawbacks in the related art, and it is an objective of the present invention to provide a control method based on touch and gesture input, and a terminal, that perform an operation in response to a series of touch and gesture inputs. It is another object of the present invention to provide a control method based on touch and gesture input, and a terminal, that switch between objects in response to a gesture input made subsequent to an ongoing touch input.
According to an example aspect of the present invention, a method for controlling a terminal preferably includes detecting a touch input; selecting at least one object corresponding to the touch input; detecting a gesture input; and executing a switch corresponding to the gesture input in a state in which the object is held at the position of the touch input. Preferably, the object that is held at the touch input position includes at least one of an icon, text, an image, a file, a folder, web browser content, a web address and a web link. Preferably, selecting at least one object corresponding to the touch input includes presenting the object in one of an activated, enlarged, reduced or shaded state. Preferably, detecting a gesture input comprises detecting the gesture input in a state in which the touch input is maintained. Preferably, the switch corresponding to the gesture input comprises one of a page switch, a folder switch, a tab switch, an application switch and a task switch. Preferably, the execution includes maintaining the selected object corresponding to the touch input; and switching between a plurality of pages having at least one object, in the state in which the selected object is kept on the screen. Preferably, the execution can also include maintaining the selected object corresponding to the touch input; and switching between higher- and lower-level folders along a file path, or between folders in a folder list. Preferably, the execution can also include maintaining the selected object corresponding to the touch input; and switching between a plurality of tabs provided by a web browser. Preferably, the execution can also include maintaining the selected object corresponding to the touch input; and switching between applications or tasks listed in a predetermined list or in a list of applications or tasks currently running.
Preferably, switching between applications or tasks includes displaying the selected object in a format optimally arranged for the application or task. Preferably, the method according to the present invention further includes detecting a release of the touch input; and executing an operation corresponding to the release of the touch input for the selected object. Preferably, according to the present invention, the operation corresponding to the release of the touch input is one of arranging the object at a position indicated by the touch input, executing a link of the selected object in a tab of the web browser, and pasting the object onto, and running it on, the application or task screen. According to another example aspect of the present invention, a terminal includes an input unit which detects touch and gesture inputs; a control unit configured to detect the selection of at least one object corresponding to the touch input on the touch screen display, and to perform a switching of the images shown on the display, corresponding to the gesture input, in a state in which the object is kept at the position of the touch input; and a display unit which displays a screen under the control of the control unit. Preferably, the switch is one of a page switch, a folder switch, a tab switch, an application switch and a task switch. Preferably, the control unit is configured to "hold" the selected object corresponding to the touch input and to switch between a plurality of pages having at least one object, in the state in which the selected object is kept on the screen. Preferably, the control unit is configured to "hold" the selected object corresponding to the touch input and to switch between higher- and lower-level folders along a file path or between folders in a folder list. Preferably, the control unit is configured to "hold" the selected object corresponding to the touch input and to switch between a plurality of tabs provided by a web browser.
Preferably, the control unit is configured to "hold" the selected object corresponding to the touch input and to switch between applications or tasks listed in a predetermined list or in a list of applications or tasks currently running. Preferably, the control unit is configured to display the selected object in a format optimally arranged for the application or task. Preferably, the input unit detects a release of the touch input, and the control unit performs one of arranging the object at a position indicated by the touch input, executing the link of the selected object in a tab of the web browser, and pasting the object onto an application or task execution screen. In addition, a method for controlling a terminal preferably comprises: detecting a touch input by a sensor (111-116) on a touch screen display (140); detecting, by a control unit (120) of the terminal, a selection of at least one object from a plurality of objects corresponding to the touch input on the touch screen display; detecting a gesture input in a state in which the touch input is maintained for at least a partial time overlap with the detection of the gesture; and performing a switching of a display of one or more of the plurality of other objects besides the at least one object (141), which is being held in the same position on the touch screen display (140), corresponding to a direction associated with the gesture input, in a state in which the at least one object (141) is being held at a position of the touch input during the detection of the gesture input on the touch screen display (140).
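The claimed sequence (a touch selects and holds an object, a gesture detected while the touch is maintained triggers a switch, and releasing the touch drops the object) can be sketched as a minimal model. This is an illustrative assumption of how such a flow could be structured; the class and method names are not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Terminal:
    """Minimal sketch of the claimed flow: a touch selects an object,
    and a gesture detected while the touch is held switches pages
    without moving the selected object (hypothetical model)."""
    pages: list              # each page is a list of object names
    current_page: int = 0
    held_object: str = None

    def on_touch(self, obj_name):
        # Steps 210/220: detect the touch and select the object under it.
        if obj_name in self.pages[self.current_page]:
            self.held_object = obj_name

    def on_gesture(self, direction):
        # Steps 230/240: a gesture while the touch is held switches pages.
        if self.held_object is None:
            return
        step = 1 if direction == "left" else -1  # sweep left shows next page
        target = self.current_page + step
        if 0 <= target < len(self.pages):
            # the held object is carried from the old page to the new one
            self.pages[self.current_page].remove(self.held_object)
            self.current_page = target
            self.pages[self.current_page].append(self.held_object)

    def on_release(self):
        # Step 250: releasing the touch drops the object on the current page.
        dropped, self.held_object = self.held_object, None
        return dropped
```

Here the held object stays logically attached to the touch while the rest of the page content is exchanged, mirroring the claim language.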
BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1 is a block diagram illustrating a terminal configuration in accordance with an exemplary embodiment of the present invention;
Figure 2 is a front view of a terminal equipped with a camera sensor and an infrared sensor;
Figure 3 is a flow chart illustrating the method for controlling the terminal according to an example embodiment of the present invention;
Figure 4 is a diagram illustrating an example touch input action for use in an example embodiment of the present invention;
Figure 5 is a diagram showing a combination of touch and gesture inputs for use in an example embodiment of the present invention;
Figure 6 is a diagram illustrating an example page switching operation based on the combination of touch and gesture inputs according to an example embodiment of the present invention;
Figure 7 is a diagram illustrating an example folder switching operation based on the combination of touch and gesture inputs according to an example embodiment of the present invention;
Figure 8 is a diagram illustrating an example tab switching operation based on the combination of touch and gesture inputs according to an example embodiment of the present invention; and
Figure 9 is a diagram illustrating an example application or task switching operation based on the combination of touch and gesture inputs according to an example embodiment of the present invention.

DETAILED DESCRIPTION

The present invention is suitable for many uses, one of which includes the control of a terminal enabled for touch and gesture inputs. The present invention is applicable to all types of terminals enabled for touch and gesture inputs, including a smartphone, a portable terminal, a mobile terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a laptop, a notepad, a Wibro terminal, a tablet PC, a smart TV and a smart refrigerator, and their equivalents, just to name a few non-limiting examples.
The terminology used herein is for the purpose of illustrating particular example embodiments to a person skilled in the art only, and is not limiting to the claimed invention. Unless otherwise defined, all terms used herein have the same meaning as commonly understood by someone of ordinary skill in the art to which this invention pertains, and should not be construed as having either an overly broad or an overly narrow meaning. Nor should dictionary definitions from general-purpose dictionaries contradict the understanding of any terms as known in the art to persons of ordinary skill. As used here, the singular forms "a", "an" and "the" are understood to include plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the terms "comprises", "comprising", "includes" and/or "including", when used here, specify the presence of stated features, members, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components and/or groups thereof. Furthermore, the term "touch", as used here, includes a part of the user's body (for example, a hand or finger) and/or a physical object, such as a stylus pen, coming within a predetermined distance of the touch screen without making physical contact. In addition, the terms "maintained" and "maintain" must be interpreted broadly and do not require a part of the user's body (such as a finger or fingers) or the pen to remain in contact, or almost in contact, with an object on the screen while a gesture is performed to cause the switching of pages, applications, tabs, etc. For example, a single or double tap on an object can designate the object as "held", and a subsequent gesture or movement can then change pages or applications while the object remains "held" in a designated position.
In this case, where the selected object is not being held by a finger or a pen, a "release" can include a subsequent movement or touch to indicate that the object is released. The exemplary embodiments of the present invention are now described in detail with reference to the associated drawings. Figure 1 is a block diagram illustrating a terminal configuration according to an example embodiment of the present invention. As shown in figure 1, terminal 100 preferably includes an input unit 110, a control unit 120, a storage unit 130 and a display unit 140. Input unit 110 can generate a manipulation signal in response to user input. Input unit 110 preferably can include one or more of a touch sensor 111, a proximity sensor 113, an electromagnetic sensor 114, a camera sensor 115 and an infrared sensor 116. The touch sensor 111 detects a touch input made by the user. The touch sensor 111 can be implemented with one of a touch film, a touch sheet and a touchpad. The touch sensor 111 can detect a touch input and generate a corresponding touch signal, which is output to the control unit 120. The control unit 120 can analyze the touch signal to perform a function corresponding to it. The touch sensor 111 can be implemented to detect touch input made by the user through various input means. For example, a touch input can constitute the detection of a part of the user's body (for example, the hand) and/or a physical object, such as a stylus pen and equivalent manipulation means. The touch sensor 111 can preferably detect the approach of an object within a predetermined range, as well as a direct touch, depending on the implementation. With continued reference to figure 1, the proximity sensor 113 is configured to detect the presence/absence, the approach, the movement, the direction of movement, the speed of movement and the shape of an object, using the intensity of the electromagnetic field and without physical contact.
The proximity sensor 113 is preferably implemented with at least one of a transmission type photosensor, a direct reflection type photosensor, a mirrored reflection type photosensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic type proximity sensor and an infrared proximity sensor. The electromagnetic sensor 114 detects a touch or an approach of an object based on the variation of the intensity of the electromagnetic field, and can be implemented in the form of an electromagnetic resonance (EMR) or electromagnetic interference (EMI) input pad. The electromagnetic sensor 114 is preferably implemented with a coil inducing a magnetic field, and detects the approach of an object having a resonance circuit that causes a variation of the energy in the magnetic field generated by the electromagnetic sensor 114. The electromagnetic sensor 114 can detect the input, for example, by means of a stylus pen as an object having the resonance circuit. The electromagnetic sensor 114 can also detect proximity or hovering input made close to terminal 100. The camera sensor 115 converts an image input (light) through a lens into a digital signal by means of charge-coupled devices (CCD) or complementary metal oxide semiconductor (CMOS) devices. Camera sensor 115 is capable of storing the digital signal in storage unit 130 either temporarily or permanently. The camera sensor 115 is able to locate and track a specific point in a recognized image in order to detect a gesture input. Referring now to Figure 2, camera sensor 115 may include lenses facing the terminal's front and/or rear surface for capturing and converting an image through the lenses. The infrared sensor 116, which is also referred to as an IR sensor or an LED sensor, can include a light source for emitting infrared light toward an object and a light receiver for receiving the light reflected from the object (for example, the hand) approaching terminal 100.
The infrared sensor 116 can detect the amount of variation in the light received by the light receiver, in order to determine the movement of the object and its distance from the terminal. With reference again to figure 2, the infrared sensor 116 is arranged on the front and/or rear side of terminal 100, in order to receive the infrared light emitted from outside terminal 100 and/or reflected by a part of the user's body (for example, the hand). According to an exemplary embodiment of the present invention, input unit 110 can detect touch and gesture inputs by means of these sensors. The input unit 110 can detect touch and gesture inputs made simultaneously or sequentially, and a gesture input made subsequent to a touch input in progress. The control unit 120 comprises hardware, such as a processor or microprocessor, configured to control some or all of the general operations of the terminal together with its components. For example, control unit 120 preferably controls the operation and functions of terminal 100 according to input made through input unit 110. According to an exemplary embodiment of the present invention, the control unit 120 is configured to control switching based on the detection of touch and gesture inputs from one or more sensors. For example, the switch can comprise any of a page switch, a folder switch, a tab switch, an application switch and a task switch. The detailed operations of the control unit 120 are described in greater detail hereinafter with reference to the associated drawings. Storage unit 130 is preferably used for storing programs and commands for terminal 100. Control unit 120 is configured to execute programs and commands that can be stored in storage unit 130.
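The control unit's association between a detected gesture and a switching operation can be pictured as a lookup table. The table entries below are illustrative assumptions; the patent does not bind specific gestures to specific switches.

```python
# Hypothetical dispatch table: the control unit looks up the switching
# operation registered for a detected gesture. The gesture names and
# bindings here are illustrative, not taken from the patent.
SWITCH_TABLE = {
    ("sweep", "horizontal"): "page_switch",
    ("sweep", "vertical"): "folder_switch",
    ("draw", "circle"): "tab_switch",
    ("shape", "two_fingers"): "application_switch",
}

def resolve_switch(gesture_kind, gesture_attr):
    """Return the switching operation bound to a gesture, or None if
    none is registered (the terminal then ignores the gesture)."""
    return SWITCH_TABLE.get((gesture_kind, gesture_attr))
```

Storing such a mapping in the storage unit matches the text's note that switching operations corresponding to touch and gesture inputs can be stored there.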
The storage unit 130 can comprise at least one of a flash memory, a hard disk, a multimedia micro card, a card type memory (for example, an SD or XD memory), a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), a programmable ROM (PROM), a magnetic memory, a magnetic disk, and an optical disk. According to an exemplary embodiment of the present invention, storage unit 130 can be used to store at least one of an icon, text, an image, a file, a folder and various forms of content including objects, application and service functions. According to an exemplary embodiment of the present invention, the storage unit can store information about the operation corresponding to the input made through the input unit 110. For example, the storage unit 130 can be used for storing information about the switching operations corresponding to the touch and gesture inputs. With continued reference to figure 1, display unit 140 displays (outputs) information processed at terminal 100. For example, display unit 140 may display a user interface (UI) or a graphical user interface (GUI) related to voice detection, status recognition and function control. The display unit can be implemented with at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT LCD), an organic light emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display. The display unit 140 forms a touch screen together with the touch sensor of the input unit 110. The touch screen-enabled display unit 140 can operate as both an input device and an output device. According to an exemplary embodiment of the present invention, display unit 140 can preferably display any of icons, texts, images, file lists and folder lists. Display unit 140 can display at least one of a web browser and its contents, a website address and a website link.
According to an example embodiment of the present invention, the display unit 140 can display at least one object dynamically, according to the switching operation performed under the control of the control unit 120. For example, the display unit 140 can display at least one object moving in a certain direction on the screen, according to a page switching operation. Although the present description is directed to the terminal depicted in figure 1, the terminal can include components other than those shown, and/or some of the components constituting the terminal can be omitted. Figure 3 is a flow chart illustrating the method for controlling the terminal according to an example embodiment of the present invention. Referring now to Figure 3, an example method for controlling the terminal according to the presently claimed invention is discussed below. In step 210, terminal 100 determines whether or not a touch input is detected. Terminal 100 can detect more than one touch input made sequentially or simultaneously. Depending on the implementation, terminal 100 can be configured to detect different types of input, such as a proximity-based input or a pressure-based input, as well as the touch-based input. Therefore, the term touch is broad, since the detection of a relatively close approach by a finger or a detectable pointer, as sensed by a proximity sensor, can be considered to constitute a touch. If a touch is detected in step 210, then in step 220 terminal 100 will select an object. Terminal 100 can be configured to determine the position where the touch is made on the display. For example, terminal 100 can determine two-dimensional or three-dimensional coordinates of the position where the touch is made on the screen. Furthermore, terminal 100 can be configured to check the pressure, duration and movement of the touch (for example, a drag, a variation of the distance between multiple touch points, and a pattern of touch movement).
In addition, terminal 100 can select at least one object corresponding to the touch input. Terminal 100 can be configured to detect at least one object 141 located at the position where the touch input is made. The object can be any one of an icon, text, an image, a file, a folder, web content, a web address and a web link. Terminal 100 can display the selected object as activated, enlarged, reduced or shaded. Terminal 100 can display the selected object as activated, enlarged, reduced or shaded according to the length of time for which the touch input has been maintained. Referring now to the example case in figure 4, terminal 100 is operated to select an icon according to the touch input detected on the screen in idle mode. Terminal 100 can display the selected icon 141 in shaded form. In the event that a touch is made and then moved, terminal 100 displays the movement of the selected object 141 according to the movement. For example, if a touch is detected and moved in a certain direction, terminal 100 can display the movement of the selected object in the same direction. Terminal 100 can express/display the state of movement of the selected object. For example, terminal 100 may display the selected object with an additional indicator or visual effect, such as vibration, enlargement, reduction or shading, to express that the object is in a mobile state. Then, with reference to the flowchart of figure 3, in step 230, terminal 100 determines whether a gesture input is detected. The terminal 100 can detect a sweeping gesture in a certain direction, a drawing gesture tracing a certain shape, and a shaping gesture forming a certain shape. Terminal 100 can detect a gesture input's direction, speed, shape and distance from terminal 100. Depending on the particular implementation, terminal 100 can detect an approach input or a pressure input instead of a gesture input. Terminal 100 detects the gesture input in the state in which the touch input is maintained.
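The selection step (step 220) amounts to hit-testing the touch coordinates against the objects shown on the screen. A minimal sketch follows; the bounding-box representation and function name are assumptions for illustration only.

```python
def select_object(objects, touch_x, touch_y):
    """Return the name of the first object whose bounding box contains
    the touch point, or None if the touch lands on empty screen.
    `objects` maps an object name to its (x, y, width, height) box.
    This is a hypothetical sketch of the step-220 selection."""
    for name, (x, y, w, h) in objects.items():
        if x <= touch_x < x + w and y <= touch_y < y + h:
            return name
    return None
```

A terminal would then render the returned object as activated, enlarged, reduced or shaded, as the text describes.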
With reference to the example case of figure 5, terminal 100 detects a touch input and a subsequent gesture input made in the state in which the touch input is maintained. If, in step 230, the gesture input is detected, then, in step 240, terminal 100 will perform a switching operation. More particularly, terminal 100 performs the switching operation corresponding to the detected gesture input. Terminal 100 looks up the switching operation associated with the gesture input and, if a switching operation is retrieved, executes it. The switching operation can comprise any of a page switching operation, a folder switching operation, a tab switching operation, an application switching operation and a task switching operation, just to name a few non-limiting possibilities. The page switching operation can be performed so that the current page is switched to another page, with the exception of the selected object 141. The page switching operation can be performed on a display screen in which a plurality of pages, each having at least one object, is turned one by one in response to a user request. For example, the page switching operation can be performed on the screen in idle mode, on a file or folder list screen, a selectable menu list screen, a document screen, an e-book screen, a phone book screen, etc., just to name a few non-limiting possibilities. The terminal 100 can perform a page switch in such a way that, when the current page has a plurality of objects, the display is switched to another page, with the exception of the selected object, in the state in which the selected object is fixed by the touch input. In other words, terminal 100 turns the current page with the unselected objects (which may include the background image) to the next page in a horizontal or vertical direction on the screen, while the object selected by the touch input remains in a fixed position on the display.
At this time, the page turning direction and the number of pages turned can be determined according to the direction of the gesture (for example, horizontal or vertical) or the shape of the gesture (for example, the shape of the hand expressing a certain number). According to the page switching operation, the objects on the previously displayed page disappear, except for the object being held, and the objects on the new page appear on the screen. In the event that there is no other page corresponding to the gesture input, terminal 100 can refrain from turning the page, or display a message, an icon or an image notifying the user that there are no other pages. With reference to the example case in figure 6, terminal 100 selects an icon on the idle screen in response to a touch input. At this point, terminal 100 displays the selected icon in shaded form. The terminal 100 can detect a subsequent gesture input. The gesture input can comprise any detectable movement but, in this example, it comprises a sweep in the right-to-left direction. Terminal 100 can perform the page switching operation while fixing the selected icon in response to the touch input. In other words, terminal 100 turns the page in the direction corresponding to the direction of the gesture. As terminal 100 moves the objects to the left, with the exception of the selected object, the other page appears from the right side. As shown in figure 6, at least one object 141 is being held at one position of the touch input during the detection of a gesture input. Those skilled in the art will understand and appreciate that the term "during" may denote a temporal overlap (that is, a period of overlapping time) between the touch of the object and the detection of the gesture input, and it is not an absolute requirement in some embodiments that the object be maintained while a recognized gesture is made for page changes, for example.
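The page-turning rule just described (direction and count derived from the gesture, clamping at the first and last page with a notification instead of a turn) can be sketched as a small pure function. The direction labels and the returned status string are illustrative assumptions.

```python
def turn_pages(current, total, direction, count=1):
    """Compute the page index after turning `count` pages in the given
    gesture direction. At either end the index is left unchanged and a
    status is returned, mirroring the terminal's behaviour of showing a
    "no other pages" notification instead of turning (hypothetical
    sketch; direction names are assumptions)."""
    step = count if direction == "right_to_left" else -count
    target = current + step
    if not 0 <= target < total:
        return current, "no_more_pages"
    return target, None
```

The `count` parameter models the text's note that the number of pages turned may follow the shape of the gesture, such as a hand expressing a certain number.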
The folder switching operation comprises navigation between folders based on the file path of the selected object. The folder switching operation can be performed between files or folders. For example, folder switching can be performed between folders including documents, images, photos, e-books, music files, application execution files or shortcut icons, and program execution files or shortcut icons. For example, the user can hold or designate a photo and then, with a recognized gesture, switch between applications, so that the photo can be inserted into an e-mail, a text message, Facebook, or virtually any type of communication application that allows the transmission of images. The terminal determines the file path of the selected object held corresponding to the touch input. Terminal 100 can move from a folder to a higher- or lower-level folder along the file path, or to a previous or next folder in a folder list. At this point, the decision as to whether to move to the higher- or lower-level folder, or whether to move to a previous or next folder at the same level, can be determined according to the direction (horizontal or vertical) of the gesture or the shape of the gesture (for example, the shape of the hand indicating a certain number). According to the folder switching, objects in the old folder disappear and objects in the new folder appear on the screen. In the case where there is no other folder corresponding to the gesture input, terminal 100 bypasses the folder switching operation or displays a message, icon or image notifying the user that there is no other folder. Referring now to the example case in figure 7, terminal 100 selects a photo in the Album 1 folder corresponding to a touch input. At this point, terminal 100 displays the selected photo in shaded form. Terminal 100 can detect a subsequent gesture input. Terminal 100 can also detect the subsequent gesture input while the touch input is maintained.
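The two folder moves described above (up or down along the held object's file path, or sideways within a folder list at the same level) can be sketched with path operations. Function names and the clamping behaviour at the ends of the list are illustrative assumptions.

```python
import posixpath

def folder_up(path):
    """Parent folder along the file path of the held object, e.g. a
    vertical gesture moving to the higher-level folder (sketch)."""
    parent = posixpath.dirname(path.rstrip("/"))
    return parent or "/"

def folder_sibling(folders, current, direction):
    """Previous or next folder in a folder list at the same level,
    e.g. a horizontal gesture; returns the current folder unchanged
    at either end, where the terminal would instead show a
    "no other folder" notification (sketch)."""
    i = folders.index(current) + (1 if direction == "next" else -1)
    return folders[i] if 0 <= i < len(folders) else current
```

This matches the figure 7 example, where a sweep moves from the Album 1 folder to the Album 2 folder while the selected photo stays held.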
The gesture input can be, for example, a sweep gesture input made in the right-to-left direction. Terminal 100 can perform the folder switching operation while keeping the selected photo at the position corresponding to the touch input. In other words, terminal 100 can switch the folder in the direction corresponding to the gesture input. The terminal 100 controls the operation so that the objects included in the Album 1 folder, with the exception of the selected object, disappear from the screen, and then a list of the photos included in the next folder, that is, the Album 2 folder, appears on the screen. The tab switching operation comprises navigation between tabs representing respective applications or programs. The tab switching operation can be performed between tabs of a web browser, a menu window, an e-book and/or document viewer applications or programs. Terminal 100 can hold an object corresponding to a touch input and perform the tab switching operation. In other words, terminal 100 can move the current tab, or at least one object included in the current tab, in a horizontal or vertical direction in relation to another tab, or place it on another tab. At this time, the tab switching direction and the number of switching operations can be determined according to the direction (horizontal or vertical) or the shape of the gesture. According to the tab switching operation, objects on one tab disappear and objects on another tab appear on the screen. In the case where there is no other tab corresponding to the gesture input, terminal 100 bypasses the tab switching operation and displays a message, icon or image notifying the user that there is no target tab. With reference to the example case in figure 8, terminal 100 can select at least one of the objects shown in the current tab of the web browser screen corresponding to a touch input. The object can comprise a web page address, or text, an image, an icon or a flash element including a link to a certain web page.
At this point, terminal 100 displays the selected object with a change in color, font, line thickness, size or a shadow effect. The terminal can detect a gesture input subsequent to the touch input in progress. The gesture input can comprise a sweep gesture input made in the left-to-right direction. The terminal performs the tab switching operation while the selected object is held at a position on the display according to the touch input. In other words, the terminal switches the tab in the direction corresponding to the gesture input direction. Terminal 100 controls the display so that objects on the old tab, with the exception of the selected object, disappear, and objects belonging to another tab of the web browser screen appear together with the selected object. An application or task switching operation comprises a switching between application or task execution screens for moving a selected object. Switching an application or task can be performed among different applications or tasks predetermined by the user or the terminal manufacturer, or among the applications or tasks that are currently running. Terminal 100 receives and stores a list of applications or tasks available for switching that are provided by the user or the terminal manufacturer. Terminal 100 identifies the applications or tasks currently running and performs the switching operation based on preferences, usage frequencies and operating times of the respective applications or tasks. The application or task can be any one of a message, SMS, e-mail, memo or call application or task, just to name some non-limiting possibilities. In accordance with this aspect of the present invention, terminal 100 performs the application or task switching operation with the objects, except for the object selected on the screen, while maintaining the object selected by the touch input.
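When the held object is carried across an application or task switch, the text says it is displayed in a format optimized for the target. That per-target formatting can be pictured as a lookup. The object types, application names and format labels below are all illustrative assumptions.

```python
def format_for_target(obj_type, target_app):
    """Hypothetical mapping from (held object type, target application)
    to the presentation used after an application switch, in the spirit
    of figure 9. Every entry here is an assumed example, not a mapping
    defined by the patent."""
    table = {
        ("image", "sms"): "attached_to_message",
        ("image", "email"): "attached_with_markup",   # e.g. HTML/XML attach code
        ("text", "memo"): "pasted_inline",
    }
    # fall back to a generic preview when no optimized format is known
    return table.get((obj_type, target_app), "preview_thumbnail")
```

In the figure 9 example, switching to the messaging application shows the held image as an attachment in the message entry window, which corresponds to the first table entry.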
In other words, terminal 100 moves the objects (which can include the background image) in a horizontal or vertical direction to display another application or task window on the screen, while keeping the selected object in the position corresponding to the touch input. At this time, the switching direction and the number of switching times can be determined according to the direction (horizontal or vertical) or the shape (for example, a hand shape symbolizing a certain number) of the gesture input. According to the application or task switching operation, the current application or task and the objects belonging to it disappear, and another application or task and the objects belonging to it appear on the screen. In the event that no other application or task is targeted by the gesture input, terminal 100 displays a message, icon, or image notifying the user that there is no target application or task to display. Referring now to the example case in figure 9, terminal 100 selects an image targeted by a touch input. At this point, the terminal processes the image into an enlarged, reduced, shaded, or vibrating shape. Terminal 100 detects a gesture input. Terminal 100 is capable of detecting the gesture input subsequent to the touch input in progress. The gesture input can be a sweep gesture input made in the right-to-left direction. Terminal 100 performs the application or task switching operation while maintaining the selected image in the position of the touch input. Terminal 100 performs the switching operation in the direction of the gesture input. Terminal 100 controls the display so that the objects, with the exception of the selected image, move left to disappear, and then the objects belonging to the previous or next task appear on the screen. Terminal 100 displays the selected object, in association with the application or task switching, in the format optimized for the target application or task. 
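As a rough sketch of the hold-and-switch behaviour above — the held object stays at its touch position while every other object on the screen is replaced by the target screen's objects — one might model the screen contents like this. The dictionary-based object layout is an assumption for illustration only.

```python
def switch_screen(current_objects, held_id, target_objects):
    """Replace everything on screen except the held object.

    The held object keeps its position; the remaining objects of the
    current screen disappear and the target screen's objects appear
    alongside it.
    """
    held = [obj for obj in current_objects if obj["id"] == held_id]
    incoming = [obj for obj in target_objects if obj["id"] != held_id]
    return held + incoming
```

In a real terminal, the "disappear" and "appear" steps would be animated as the horizontal or vertical movement the passage describes; this sketch only captures which objects survive the switch.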
Terminal 100 presents a preview image of the selected object in the format optimized for adding, inserting, pasting, or attaching it to the target application or task. Terminal 100 displays the object enlarged, reduced, rotated, or changed in extension or resolution, or together with a text, an image, or an icon indicating addition, insertion, pasting, or attachment. Referring now to the example case in figure 9, if an application switch to the text messaging application is performed, terminal 100 will display the selected image in an attached format in the message entry window. At this point, terminal 100 displays an icon notifying the user that the image is attached to the text message. If the application switching operation is performed for the e-mail application, terminal 100 will display the selected image in the attached format in the mail composition window. Terminal 100 displays the image in the format attached to the mail together with the code for attaching the image, such as HTML or XML. At this time, terminal 100 displays at least one of a file name, an icon, or a file attachment menu to notify the user that the image file is attached to the e-mail. Then, terminal 100 determines in step 250 whether the touch input is terminated. After executing the switching operation, or if no gesture input is detected, terminal 100 determines whether the touch input is terminated. It is determined that the touch input is terminated when the touch screen input device of terminal 100 detects that the user has released contact. If the touch input is not terminated, the terminal will repeat the switching operation corresponding to the gesture input detection. If the user releases contact from the input device of terminal 100, the switching operation of terminal 100 will then be terminated. 
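The idea described above of previewing the held object in a format optimized for the target application can be sketched as a simple mapping. The application names, placement strings, and indicator strings below are illustrative assumptions, not values from the patent.

```python
# Hypothetical mapping from target application to how the held image
# is previewed after the switch (cf. the message and e-mail examples).
ATTACH_PREVIEW = {
    "sms": {"placement": "message entry window", "indicator": "attachment icon"},
    "email": {"placement": "mail composition window", "indicator": "file name"},
}

def preview_hint(target_app: str) -> dict:
    """Return how to render the held object on the target screen."""
    default = {"placement": "screen", "indicator": "none"}
    return ATTACH_PREVIEW.get(target_app, default)
```

A target application without a registered format simply shows the object unadorned, matching the fallback behaviour of displaying the object as-is.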
Otherwise, if the touch input is terminated, terminal 100 will terminate the procedure at step 260. Termination operations can comprise any of arranging the selected object in a position targeted by the touch input, executing the link of the selected object in a web browser tab, and pasting the object on an application or task execution screen. In the example embodiment of figure 6, if the touch input is terminated, terminal 100 will arrange the selected icon in the position where the touch was released. In order to place the icon in the position desired by the touch input, the terminal can move or rearrange the other icons on the page. Terminal 100 can also store information about the rearranged page. An icon or other item can also be designated by a tap or by a non-contact pointer that the touch screen recognizes as designating that particular item to remain stationary while a gesture, such as a sweep movement, moves through applications, screens, and so on; since such an icon or item is not being held in this example, another recognized act, such as a double tap, another tap, or a movement, may signal that the icon or other item is no longer designated. In the example embodiment shown in figure 7, if the touch input is terminated, terminal 100 will arrange the page by placing the selected image in the position where the touch was released. Terminal 100 rearranges the list of images in the navigated folder for insertion of the image. Terminal 100 moves and stores the selected image in the corresponding folder or address and updates the information of the folder or image. In the example embodiment of figure 8, if the touch input is terminated, terminal 100 will add, insert, paste, or attach the selected object on the application or task execution screen where the touch input was terminated. Terminal 100 attaches the selected image to a text message and inserts the selected image into the message composition window. 
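The release-time rearrangement in the figure 6 example — drop the held icon where the touch ended and shift the other icons to make room — can be sketched as follows; the list-of-names representation of a page is an illustrative assumption.

```python
def drop_icon(page_icons: list, held: str, release_index: int) -> list:
    """Place the held icon at the release position and rearrange.

    The other icons shift to make room, mirroring the page
    rearrangement the terminal stores after the touch is released.
    """
    remaining = [icon for icon in page_icons if icon != held]
    remaining.insert(release_index, held)
    return remaining
```

The returned ordering is what the terminal would persist as the "information about the rearranged page".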
Terminal 100 also executes a text composition mode and attaches the selected image to the text composition window in order to post the selected image to an SNS site. The configuration of terminal 100 is not limited to the example embodiments described above, but can be modified to perform various operations in response to the detection of the termination of the touch input, without departing from the scope of the present invention. The control method based on touch and gesture input, and the terminal for it according to the present invention, facilitate the control of terminal operations through the combination of intuitive touch and gesture inputs made on the improved input interface. The methods described above in accordance with the present invention can be implemented in hardware, in firmware, or as software or computer code loaded into hardware, such as a processor or microprocessor, and executed, the machine-executable code being stored on a storage medium such as a CD-ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or as computer code transferred (via download) over a network, originally stored on a remote recording medium or a non-transitory machine-readable medium, to be stored on a non-transitory local recording medium, so that the methods described here can be rendered in software that is stored on the recording medium and executed using a general-purpose computer, a special processor, or programmable or dedicated hardware, such as an ASIC or an FPGA. As would be understood in the art, the computer, the processor, the microprocessor controller, or the programmable hardware includes memory components, for example, a RAM, a ROM, a flash memory, etc., which can store or receive software or computer code which, when accessed and executed by the computer, processor, or hardware, implements the processing methods described here. 
In addition, it would be recognized that when a general-purpose computer accesses code for implementing the processing shown here, executing the code transforms the general-purpose computer into a special-purpose computer for performing the processing shown here. Although example embodiments of the present invention have been described in detail hereinabove with specific terminology, this is for the purpose of describing particular example embodiments only and is not intended to be limiting of the invention. Although particular example embodiments of the present invention have been illustrated and described, it would be obvious to those skilled in the art that various other changes and modifications can be made without departing from the spirit and scope of the invention.
Claims (15) [0001] 1. Method for controlling a terminal, characterized by the fact that it comprises: the detection of a touch input on a touch screen display; the detection of a selection of at least one object from a plurality of objects corresponding to the touch input on the touch screen display; the detection of a gesture input by a sensor in a state in which the touch input on the at least one object is maintained on the touch screen for at least a period of overlap when the gesture is detected; and the execution of a switching of a display of one or more of the plurality of other objects, in addition to the at least one object which is kept on the touch screen display, corresponding to the gesture input, in a state in which the object is kept in a position of the touch input, in which a switching direction is determined according to a direction of the gesture input and a number of times of the switching is determined according to a shape of the gesture input. [0002] 2. Method, according to claim 1, characterized by the fact that the at least one object kept on the touch screen display comprises at least one of an icon, a text, an image, a file, a folder, the contents of a web browser, a web address, and a web link. [0003] 3. Method, according to claim 1, characterized by the fact that the switching of the objects being displayed while the at least one object is being held comprises at least one of a page switching, a folder switching, a tab switching, an application switching, and a task switching. [0004] 4. Method, according to claim 1, characterized by the fact that the execution of a switching comprises: the detection of at least one object corresponding to the touch input; and the switching between a display of a plurality of pages having the at least one object in the state of being maintained on the touch screen display. [0005] 5. 
Method, according to claim 1, characterized by the fact that it further comprises: the detection, by the touch sensor, of a release of the touch input on the selected object being kept on the touch screen display; and the execution, by the control unit, of an operation corresponding to the release of the touch input for the selected object. [0006] 6. Method, according to claim 5, characterized by the fact that the operation corresponding to the release of the touch input comprises the execution of at least one of: arranging the object in a position corresponding to the release of the touch input, executing a link of the selected object in a web browser tab, and pasting the object to an application or task execution screen. [0007] 7. Method, according to claim 5, characterized by the fact that the object comprises at least one image that is pasted into or attached to a text message or an e-mail message. [0008] 8. Method, according to claim 5, characterized by the fact that the object comprises at least one image that is pasted into or attached to a social networking application. [0009] 9. Method, according to claim 1, characterized by the fact that the gesture includes the shape of one hand. [0010] 10. Terminal, characterized by the fact that it comprises: an input unit which detects touch and gesture inputs; a control unit which detects a selection of at least one object from a plurality of objects corresponding to a touch input and performs a switching of a display of one or more of the plurality of other objects, in addition to the at least one object, corresponding to the gesture input, in a state in which the selected object is held in a position of the touch input on the input unit; and a display unit which displays a screen under the control of the control unit, wherein the control unit determines a switching direction according to a direction of the gesture input and a number of times of the switching according to a shape of the gesture input. [0011] 11. 
Terminal, according to claim 10, characterized by the fact that the switching performed is one among a page switching, a folder switching, a tab switching, an application switching, and a task switching. [0012] 12. Terminal, according to claim 10, characterized by the fact that the input unit detects a release of the touch input, and the control unit executes one of: arranging the selected object in a position corresponding to the release of the touch input, executing a link of the selected object in a web browser tab, and pasting the selected object in an application or task execution screen. [0013] 13. Terminal, according to claim 12, characterized by the fact that the selected object comprises at least one image that is pasted into or attached to a text message or an e-mail message. [0014] 14. Terminal, according to claim 12, characterized by the fact that the selected object comprises at least one image that is pasted into or attached to a social networking application. [0015] 15. Terminal, according to claim 10, characterized by the fact that the gesture includes the shape of one hand.
Similar technologies:
公开号 | 公开日 | 专利标题 BR102013016792B1|2021-02-23|CONTROL METHOD BASED ON TOUCH AND GESTURE INPUT AND TERMINAL FOR THAT US10635299B2|2020-04-28|Device, method, and graphical user interface for manipulating windows in split screen mode US9990107B2|2018-06-05|Devices, methods, and graphical user interfaces for displaying and using menus US9645732B2|2017-05-09|Devices, methods, and graphical user interfaces for displaying and using menus DK179374B1|2018-05-28|Handwriting keyboard for monitors US9367161B2|2016-06-14|Touch sensitive device with stylus-based grab and paste functionality BR112013006616B1|2021-03-16|apparatus and method for detecting an object based on proximity to the input surface, associated item of information and distance from the object BR112012008792B1|2021-03-23|CONTENT LIMIT SIGNALING TECHNIQUES US9891813B2|2018-02-13|Moving an image displayed on a touchscreen of a device EP2664986A2|2013-11-20|Method and electronic device thereof for processing function corresponding to multi-touch WO2014197745A1|2014-12-11|One handed gestures for navigating ui using touchscreen hover events US9864514B2|2018-01-09|Method and electronic device for displaying virtual keypad EP2717133A2|2014-04-09|Terminal and method for processing multi-point input US20140149923A1|2014-05-29|Information processing apparatus, display apparatus, method for controlling information processing apparatus, and program US10921975B2|2021-02-16|Devices, methods, and user interfaces for conveying proximity-based and contact-based input events US20180349659A1|2018-12-06|Device, Method, and Graphical User Interface for Handling Data Encoded in Machine-Readable Format US10402080B2|2019-09-03|Information processing apparatus recognizing instruction by touch input, control method thereof, and storage medium KR101325535B1|2013-11-07|Method, terminal, and recording medium for controlling screen JP2015225126A|2015-12-14|Information processor, method and program BR112015000735B1|2021-11-30|PORTABLE 
TERMINAL USING TACTILE PEN AND MANUSCRIPT ENTRY METHOD USING THE SAME
Patent family:
Publication No. | Publication date EP2687971A3|2017-04-19| EP2687971A2|2014-01-22| AU2013206192A1|2014-01-30| BR102013016792A2|2015-08-25| JP2014021983A|2014-02-03| RU2013129862A|2015-01-10| KR101984154B1|2019-05-30| CN103543943B|2018-11-23| US20140019910A1|2014-01-16| CA2818248A1|2014-01-16| AU2013206192B2|2018-08-30| TW201411469A|2014-03-16| TWI594178B|2017-08-01| US20180136812A1|2018-05-17| KR20140010596A|2014-01-27| CN103543943A|2014-01-29| JP6230836B2|2017-11-15|
Cited references:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题 JPH0876926A|1994-09-02|1996-03-22|Brother Ind Ltd|Picture display device| US8139028B2|2006-02-01|2012-03-20|Synaptics Incorporated|Proximity sensor and method for indicating extended interface results| US8086971B2|2006-06-28|2011-12-27|Nokia Corporation|Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications| US7877707B2|2007-01-06|2011-01-25|Apple Inc.|Detecting and interpreting real-world and security gestures on touch and hover sensitive devices| JP4171770B1|2008-04-24|2008-10-29|任天堂株式会社|Object display order changing program and apparatus| US8631340B2|2008-06-25|2014-01-14|Microsoft Corporation|Tab management in a user interface window| US8762879B1|2008-09-01|2014-06-24|Google Inc.|Tab management in a browser| JP5279646B2|2008-09-03|2013-09-04|キヤノン株式会社|Information processing apparatus, operation method thereof, and program| US8756519B2|2008-09-12|2014-06-17|Google Inc.|Techniques for sharing content on a web page| JP2010157189A|2009-01-05|2010-07-15|Sony Corp|Information processor, information processing method and program| TW201101198A|2009-06-17|2011-01-01|Sonix Technology Co Ltd|Command input method| US8471824B2|2009-09-02|2013-06-25|Amazon Technologies, Inc.|Touch-screen user interface| CN102023784A|2009-09-16|2011-04-20|创新科技有限公司|Method and equipment for inputting characters in non-contact mode| US8769428B2|2009-12-09|2014-07-01|Citrix Systems, Inc.|Methods and systems for generating a combined display of taskbar button group entries generated on a local machine and on a remote machine| US10007393B2|2010-01-19|2018-06-26|Apple Inc.|3D view of file structure| US8707174B2|2010-02-25|2014-04-22|Microsoft Corporation|Multi-screen hold and page-flip gesture| CN102200830A|2010-03-25|2011-09-28|夏普株式会社|Non-contact control system and control method based on static gesture recognition| US9170708B2|2010-04-07|2015-10-27|Apple Inc.|Device, method, and 
graphical user interface for managing folders| CN102033710B|2010-04-07|2015-03-11|苹果公司|Method for managing file folder and related equipment| KR20110127853A|2010-05-20|2011-11-28|엘지전자 주식회사|Mobile terminal and method for controlling the same| JP5556515B2|2010-09-07|2014-07-23|ソニー株式会社|Information processing apparatus, information processing method, and program| TW201216090A|2010-10-13|2012-04-16|Sunwave Technology Corp|Gesture input method of remote control| JP2012108800A|2010-11-18|2012-06-07|Ntt Docomo Inc|Display device, control method for display device and program| KR101932688B1|2010-11-29|2018-12-28|삼성전자주식회사|Portable Device and Method for Providing User Interface Mode thereof| CN102043583A|2010-11-30|2011-05-04|汉王科技股份有限公司|Page skip method, page skip device and electronic reading device| TW201224843A|2010-12-03|2012-06-16|Microlink Comm Inc|Paging method for electronic book reading device| KR101892630B1|2011-01-10|2018-08-28|삼성전자주식회사|Touch display apparatus and method for displaying thereof| US9778747B2|2011-01-19|2017-10-03|Hewlett-Packard Development Company, L.P.|Method and system for multimodal and gestural control| US20120218203A1|2011-02-10|2012-08-30|Kanki Noriyoshi|Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus| US20120304059A1|2011-05-24|2012-11-29|Microsoft Corporation|Interactive Build Instructions| US20130067392A1|2011-09-12|2013-03-14|Microsoft Corporation|Multi-Input Rearrange| KR101381484B1|2012-02-29|2014-04-07|주식회사 팬택|Mobile device having a graphic object floating function and execution method using the same| US20130268837A1|2012-04-10|2013-10-10|Google Inc.|Method and system to manage interactive content display panels| US9740393B2|2012-05-18|2017-08-22|Google Inc.|Processing a hover event on a touchscreen device|US7877707B2|2007-01-06|2011-01-25|Apple Inc.|Detecting and interpreting real-world and security gestures on touch and hover sensitive 
devices| EP3132330B1|2014-04-16|2019-07-03|Neodrón Limited|Determining touch locations and forces thereto on a touch and force sensing surface| CN102830909B|2012-07-18|2015-11-25|华为终端有限公司|A kind of icon management method of user interface and touch control device| KR102117450B1|2013-03-26|2020-06-01|삼성전자주식회사|Display device and method for controlling thereof| USD731553S1|2013-07-31|2015-06-09|Sears Brands, L.L.C.|Display screen or portion thereof with an icon| USD731551S1|2013-08-01|2015-06-09|Sears Brands, L.L.C.|Display screen or portion thereof with an icon| USD734345S1|2013-08-01|2015-07-14|Sears Brands, L.L.C.|Display screen or portion thereof with an icon| WO2015022498A1|2013-08-15|2015-02-19|Elliptic Laboratories As|Touchless user interfaces| CN104423789B|2013-09-09|2018-07-06|联想有限公司|A kind of information processing method and electronic equipment| US9531722B1|2013-10-31|2016-12-27|Google Inc.|Methods for generating an activity stream| US9542457B1|2013-11-07|2017-01-10|Google Inc.|Methods for displaying object history information| US9614880B1|2013-11-12|2017-04-04|Google Inc.|Methods for real-time notifications in an activity stream| US10156976B2|2014-01-30|2018-12-18|Samsung Display Co., Ltd.|System and method in managing low-latency direct control feedback| US9509772B1|2014-02-13|2016-11-29|Google Inc.|Visualization and control of ongoing ingress actions| CN104914982B|2014-03-12|2017-12-26|联想有限公司|The control method and device of a kind of electronic equipment| CN103902156B|2014-03-17|2017-09-01|联想有限公司|A kind of information processing method and electronic equipment| DE102014004177A1|2014-03-22|2015-09-24|Audi Ag|A method and apparatus for providing a choice during a build of display content| DE102014208502A1|2014-05-07|2015-11-12|Volkswagen Aktiengesellschaft|User interface and method for switching between screen views of a user interface| US9536199B1|2014-06-09|2017-01-03|Google Inc.|Recommendations based on device usage| 
US9507791B2|2014-06-12|2016-11-29|Google Inc.|Storage system user interface with floating file collection| US10078781B2|2014-06-13|2018-09-18|Google Llc|Automatically organizing images| CN105335043A|2014-08-08|2016-02-17|宏碁股份有限公司|Window switching method and electronic apparatus executing same| US9787812B2|2014-08-28|2017-10-10|Honda Motor Co., Ltd.|Privacy management| JP6388203B2|2014-09-08|2018-09-12|任天堂株式会社|Electronics| TWI511031B|2014-10-23|2015-12-01|Qisda Corp|Electronic device operating method and electronic device| US20160139662A1|2014-11-14|2016-05-19|Sachin Dabhade|Controlling a visual device based on a proximity between a user and the visual device| US9870420B2|2015-01-19|2018-01-16|Google Llc|Classification and storage of documents| CN104598113B|2015-01-23|2018-02-27|联想有限公司|The method and electronic equipment of a kind of information processing| KR20160099399A|2015-02-12|2016-08-22|엘지전자 주식회사|Watch type terminal| CN104715026A|2015-03-03|2015-06-17|青岛海信移动通信技术股份有限公司|Folder management method and intelligent terminal| JP6534292B2|2015-04-24|2019-06-26|パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America|Head mounted display and control method of head mounted display| USD763898S1|2015-07-28|2016-08-16|Microsoft Corporation|Display screen with animated graphical user interface| CN105718768B|2016-01-12|2019-06-11|Oppo广东移动通信有限公司|A kind of method and apparatus for preventing icon maloperation| CN107203319A|2016-03-17|2017-09-26|南宁富桂精密工业有限公司|The system of interface operation control method and application this method| TWI609314B|2016-03-17|2017-12-21|鴻海精密工業股份有限公司|Interface operating control system method using the same| JP6724981B2|2016-03-31|2020-07-15|京セラドキュメントソリューションズ株式会社|Display device| KR20180062832A|2016-12-01|2018-06-11|주식회사 하이딥|Touch input method for providing uer interface and apparatus| JP6946857B2|2017-08-24|2021-10-13|富士フイルムビジネスイノベーション株式会社|Information processing equipment and programs| 
CN108491139B|2018-02-13|2020-12-25|广州视源电子科技股份有限公司|Object fixing method and device, terminal equipment and storage medium| CN108446062A|2018-02-13|2018-08-24|广州视源电子科技股份有限公司|A kind of object fixing means, device, terminal device and storage medium| CN109539486A|2018-07-16|2019-03-29|珠海磐磊智能科技有限公司|The control method and system of regulator control system| JP2020052681A|2018-09-26|2020-04-02|シュナイダーエレクトリックホールディングス株式会社|Operation processing device| CN109743445B|2018-12-20|2020-07-17|惠州Tcl移动通信有限公司|Application icon adjusting method and device, storage medium and electronic equipment|
Legal status:
2015-08-25| B03A| Publication of a patent application or of a certificate of addition of invention [chapter 3.1 patent gazette]| 2018-12-04| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]| 2019-11-19| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]| 2020-08-25| B09A| Decision: intention to grant [chapter 9.1 patent gazette]| 2021-02-23| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 28/06/2013, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application No. | Filing date | Patent title KR1020120077021A|KR101984154B1|2012-07-16|2012-07-16|Control method for terminal using touch and gesture input and terminal thereof| KR10-2012-0077021|2012-07-16|